BScNets: Block Simplicial Complex Neural Networks
Authors
Abstract
Simplicial neural networks (SNNs) have recently emerged as a new direction in graph learning which expands the idea of convolutional architectures from node space to simplicial complexes on graphs. Instead of predominantly assessing pairwise relations among nodes, as in current practice, simplicial complexes allow us to describe higher-order interactions and multi-node graph structures. Building upon the connection between the convolution operation and the block Hodge-Laplacian, we propose the first SNN for link prediction. Our Block Simplicial Complex Neural Networks (BScNets) model generalizes existing graph convolutional network (GCN) frameworks by systematically incorporating salient interactions among multiple higher-order graph structures of different dimensions. We discuss the theoretical foundations behind BScNets and illustrate its utility for link prediction on eight real-world and synthetic datasets. Our experiments indicate that BScNets outperforms state-of-the-art models by a significant margin while maintaining low computation costs. Finally, we show that BScNets is a promising alternative for tracking the spread of infectious diseases such as COVID-19 and for measuring the effectiveness of healthcare risk mitigation strategies.
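As a rough illustration of the kind of operator the abstract refers to, the sketch below (my own, not the authors' code) builds the Hodge 1-Laplacian of a toy simplicial complex from its boundary matrices B1 and B2 and applies one GCN-style propagation step to edge features; the block structure and training setup of BScNets itself are not reproduced here.

```python
# Minimal sketch (assumptions labeled): Hodge-Laplacian convolution on edge signals.
import numpy as np

# Toy complex: 4 nodes, 5 oriented edges, 1 triangle (0,1,2).
# Edges: e0=(0,1), e1=(0,2), e2=(1,2), e3=(1,3), e4=(2,3)
# B1: node-to-edge incidence matrix (rows: nodes, cols: edges).
B1 = np.array([
    [-1, -1,  0,  0,  0],
    [ 1,  0, -1, -1,  0],
    [ 0,  1,  1,  0, -1],
    [ 0,  0,  0,  1,  1],
], dtype=float)

# B2: edge-to-triangle incidence matrix; triangle (0,1,2) = +e0 - e1 + e2.
B2 = np.array([[1.0], [-1.0], [1.0], [0.0], [0.0]])

# Hodge 1-Laplacian: lower term from B1, upper term from B2; acts on edge signals.
L1 = B1.T @ B1 + B2 @ B2.T

def simplicial_conv(X, W, L):
    """One GCN-style propagation step on edge features: tanh(L X W)."""
    return np.tanh(L @ X @ W)

rng = np.random.default_rng(0)
X = rng.normal(size=(5, 3))   # 5 edges, 3 input features per edge
W = rng.normal(size=(3, 2))   # hypothetical learnable 3 -> 2 feature map
H = simplicial_conv(X, W, L1)
print(H.shape)                # (5, 2)
```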
Similar resources
Block-based neural networks
This paper presents a novel block-based neural network (BBNN) model and the optimization of its structure and weights based on a genetic algorithm. The architecture of the BBNN consists of a 2D array of fundamental blocks with four variable input/output nodes and connection weights. Each block can have one of four different internal configurations depending on the structure settings. The BBNN m...
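Purely as an illustration of how such a block grid might be encoded for a genetic algorithm, here is a toy sketch under my own assumptions (the grid size, gene layout, and mutation scheme are hypothetical, not the paper's actual representation):

```python
# Speculative toy encoding of a block-based neural network genome for a GA.
import random

N_CONFIGS = 4          # each block takes one of four internal configurations
ROWS, COLS = 3, 4      # assumed size of the 2D block array
WEIGHTS_PER_BLOCK = 4  # four variable input/output connections per block

def random_block():
    """One gene: (configuration id, list of connection weights)."""
    return (random.randrange(N_CONFIGS),
            [random.uniform(-1.0, 1.0) for _ in range(WEIGHTS_PER_BLOCK)])

def random_genome():
    """A genome is the full ROWS x COLS grid of block genes."""
    return [[random_block() for _ in range(COLS)] for _ in range(ROWS)]

def mutate(genome, p=0.1):
    """Re-draw block configurations and perturb weights with probability p."""
    for row in genome:
        for i, (cfg, w) in enumerate(row):
            if random.random() < p:
                cfg = random.randrange(N_CONFIGS)
            w = [wi + random.gauss(0, 0.1) if random.random() < p else wi
                 for wi in w]
            row[i] = (cfg, w)
    return genome

population = [random_genome() for _ in range(10)]
population = [mutate(g) for g in population]
```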
Block-structured recurrent neural networks
This paper introduces a new class of dynamic multilayer perceptrons, called Block Feedback Neural Networks (BFN). BFNs have been developed to provide a systematic way to build networks of high complexity, including networks with coupled loops, nested loops and so on. BFNs are specified using a block notation. Any BFN can be seen as a block and connected to other BFNs using a fixed number...
Lecture 5: Simplicial Complex 2-Manifolds, Simplex and Simplicial Complex
Figure 1: Two greatly different curves can have a small Hausdorff distance. The Fréchet distance is a good similarity measure for curves in Euclidean space. It can be described with an everyday example: suppose a dog and its owner are walking along two different paths (curves), connected by a leash. Both of them move continuously and forward only, at any speed, and may even stop. Then the length of th...
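In the discrete setting, the dog-and-owner description corresponds to the standard dynamic-programming recurrence for the discrete Fréchet distance; the sketch below is an illustrative implementation of that classical recurrence, not code from the lecture notes.

```python
# Discrete Frechet distance between two polygonal curves (illustrative sketch).
import numpy as np

def discrete_frechet(P, Q):
    """Discrete Frechet distance between point sequences P and Q."""
    P, Q = np.asarray(P, float), np.asarray(Q, float)
    n, m = len(P), len(Q)
    d = np.linalg.norm(P[:, None, :] - Q[None, :, :], axis=-1)  # pairwise distances
    ca = np.full((n, m), np.inf)
    ca[0, 0] = d[0, 0]
    for i in range(1, n):                       # only the dog advances
        ca[i, 0] = max(ca[i - 1, 0], d[i, 0])
    for j in range(1, m):                       # only the owner advances
        ca[0, j] = max(ca[0, j - 1], d[0, j])
    for i in range(1, n):
        for j in range(1, m):
            ca[i, j] = max(min(ca[i - 1, j], ca[i - 1, j - 1], ca[i, j - 1]),
                           d[i, j])
    return ca[-1, -1]                           # shortest leash length needed

# Two curves with different shapes but nearby points.
P = [(0, 0), (1, 1), (2, 0), (3, 1)]
Q = [(0, 0.2), (1, 0.2), (2, 0.2), (3, 0.2)]
print(discrete_frechet(P, Q))
```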
Block-Sparse Recurrent Neural Networks
Recurrent Neural Networks (RNNs) are used in state-of-the-art models in domains such as speech recognition, machine translation, and language modelling. Sparsity is a technique to reduce compute and memory requirements of deep learning models. Sparse RNNs are easier to deploy on devices and high-end server processors. Even though sparse operations need less compute and memory relative to their ...
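One simple way to realize block sparsity is block-wise magnitude pruning; the sketch below is a hedged, generic illustration of that idea (tile size, keep ratio, and scoring are my own assumptions, not the paper's exact recipe).

```python
# Generic block-wise magnitude pruning of a weight matrix (illustrative only).
import numpy as np

def block_prune(W, block=4, keep_ratio=0.5):
    """Zero out the (1 - keep_ratio) fraction of blocks with smallest norm."""
    rows, cols = W.shape
    assert rows % block == 0 and cols % block == 0
    # View W as a grid of (block x block) tiles and score each tile by its norm.
    tiles = W.reshape(rows // block, block, cols // block, block)
    norms = np.linalg.norm(tiles, axis=(1, 3))          # one score per tile
    thresh = np.quantile(norms, 1.0 - keep_ratio)
    mask = (norms >= thresh)[:, None, :, None]          # broadcast mask to tiles
    return (tiles * mask).reshape(rows, cols)

rng = np.random.default_rng(0)
W = rng.normal(size=(16, 16))
W_sparse = block_prune(W, block=4, keep_ratio=0.5)
print(np.mean(W_sparse == 0))   # roughly half of the entries are now zero
```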
Rodbar dam slope stability analysis using neural networks
In this study, an artificial neural network is presented for predicting the safety factor and the critical safety factor of non-homogeneous earth dams, taking into account the effect of earthquake inertia forces. The model inputs include dam height, upstream slope angle, seismic coefficient, water height, and the strength parameters of the core and shell; its output is the safety factor. The most important quantity sought in slope stability analysis is the safety factor. In this study ...
Journal
Journal title: Proceedings of the ... AAAI Conference on Artificial Intelligence
Year: 2022
ISSN: 2159-5399, 2374-3468
DOI: https://doi.org/10.1609/aaai.v36i6.20583